IEOR 6711: Continuous-Time Markov Chains

Author

  • Karl Sigman

Abstract

A Markov chain in discrete time, {Xn : n ≥ 0}, remains in any state for exactly one unit of time before making a transition (change of state). We proceed now to relax this restriction by allowing a chain to spend a continuous amount of time in any state, but in such a way as to retain the Markov property. As motivation, consider the rat in the open maze. Clearly it is more realistic to be able to keep track of where the rat is at any continuous time t ≥ 0, as opposed to only where the rat is after n “steps”. Assume throughout that our state space is S = Z = {· · · ,−2,−1, 0, 1, 2, · · · } (or some subset thereof). Suppose now that whenever a chain enters state i ∈ S, independent of the past, the length of time spent in state i is a continuous, strictly positive (and proper) random variable Hi, called the holding time in state i. When the holding time ends, the process then makes a transition into state j according to transition probability Pij, independent of the past, and so on. Letting X(t) denote the state at time t, we end up with a continuous-time stochastic process {X(t) : t ≥ 0} with state space S. Our objective is to place conditions on the holding times to ensure that the continuous-time process satisfies the Markov property: the future, {X(s + t) : t ≥ 0}, given the present state, X(s), is independent of the past, {X(u) : 0 ≤ u < s}. Such a process will be called a continuous-time Markov chain (CTMC), and as we will conclude shortly, the holding times will have to be exponentially distributed. The formal definition is given by
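The construction described above — hold in state i for an exponential time, then jump according to row i of the transition matrix — lends itself to direct simulation. The following is a minimal sketch (not from the notes; the function name and parameters are illustrative), assuming each state i has a given holding-time rate `rates[i]` and that `P` is a stochastic matrix:

```python
import random

def simulate_ctmc(rates, P, x0, t_max, rng=None):
    """Simulate a CTMC path on states {0, 1, ..., n-1} up to time t_max.

    rates[i] : exponential holding-time rate in state i (H_i ~ Exp(rates[i]))
    P[i][j]  : probability of jumping to state j when leaving state i
    Returns the list of (jump time, state) pairs, starting with (0.0, x0).
    """
    rng = rng or random.Random()
    t, x = 0.0, x0
    path = [(t, x)]
    while True:
        # Holding time in the current state, independent of the past.
        t += rng.expovariate(rates[x])
        if t >= t_max:
            break
        # Transition according to row x of the transition matrix P.
        x = rng.choices(range(len(P[x])), weights=P[x])[0]
        path.append((t, x))
    return path
```

For example, a two-state chain alternating between states 0 and 1 (P = [[0,1],[1,0]]) with rates 1 and 2 spends, on average, twice as long in state 0 per visit as in state 1.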




Journal title:

Volume   Issue

Pages  -

Publication date: 2009